Numerical Optimization Lecture 13: Conjugate Gradient Method

Author

  • Sangkyun Lee

Abstract

We still need to show that the directions p0, p1, . . . , pn−1 generated by Algorithms 1 and 2 are conjugate with respect to A. If so, then by Theorem 11.3 the algorithm terminates in n steps. The next theorem establishes this property, along with two other important properties: (i) the residuals ri are mutually orthogonal, and (ii) each pk and rk is contained in the Krylov subspace of degree k for r0, defined by K(r0; k) = span{r0, Ar0, . . . , A^k r0}. To see its relation to CG, note that r0 = Ax0 − b and that, by the Cayley–Hamilton theorem, A−1 can be written as a polynomial in A: A−1 ≈ (−1)^{k−1} …
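These properties can be checked numerically. The sketch below is a minimal linear CG implementation (function name and test matrix are illustrative, not from the lecture) using the residual convention r = Ax − b from the abstract; after running it, directions are conjugate under A and residuals are mutually orthogonal, up to rounding.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Linear CG for symmetric positive definite A.

    Returns the solution along with the search directions and
    residuals generated along the way, so the conjugacy and
    orthogonality properties can be inspected.
    """
    x = x0.copy()
    r = A @ x - b              # residual, matching r0 = A x0 - b
    p = -r                     # first direction is steepest descent
    dirs, resids = [p.copy()], [r.copy()]
    for _ in range(len(b)):    # at most n steps in exact arithmetic
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r + alpha * Ap
        beta = (r_new @ r_new) / (r @ r)   # Fletcher–Reeves-type update
        p = -r_new + beta * p
        r = r_new
        dirs.append(p.copy())
        resids.append(r.copy())
    return x, dirs, resids

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)    # well-conditioned SPD matrix
b = rng.standard_normal(5)
x, dirs, resids = conjugate_gradient(A, b, np.zeros(5))

# p_i^T A p_j ≈ 0 (conjugacy) and r_i^T r_j ≈ 0 (orthogonality)
assert abs(dirs[0] @ A @ dirs[2]) < 1e-6
assert abs(resids[0] @ resids[2]) < 1e-6
assert np.allclose(A @ x, b)
```

The assertions confirm Theorem 11.3's conclusion on this small instance: the method terminates within n = 5 steps with conjugate directions and orthogonal residuals.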


Similar articles

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu–Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...


On the hybrid conjugate gradient method for solving fuzzy optimization problem

In this paper we consider a constrained optimization problem whose objectives are fuzzy functions (fuzzy-valued functions). The fuzzy constrained optimization (FO) problem plays an important role in many fields, including mathematics, engineering, and statistics. On the other hand, in real situations it is important to know how one may obtain a numerical solution of a given interesting...


A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...


DS-GA 3001.03: Extended Syllabus, Lecture 10, Optimization and Computational Linear Algebra for Data Science (Fall 2016)

[1] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004. [2] Carlos Fernandez-Granda, Lecture Notes of “Optimization-based Data Analysis”, available at http://www.cims.nyu.edu/~cfgranda/pages/OBDA_spring16/notes.html, 2016. [3] Carlos Fernandez-Granda, Lecture Notes of DSGA1002, available at http://www.cims.nyu.edu/~cfgranda/pages/DSGA1002_fall15/notes.html, 201...


A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and simple implementation. Research activity on its application to higher-dimensional systems of nonlinear equations is just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...




Publication date: 2014